A modified discrepancy principle to attain optimal convergence rates under unknown noise

Authors

Abstract

We consider a linear ill-posed equation in the Hilbert space setting. Multiple independent unbiased measurements of the right-hand side are available. A natural approach is to take the average of the measurements as an approximation of the right-hand side and to estimate the data error as the inverse of the square root of the number of measurements. We calculate the optimal convergence rate (as the number of measurements tends to infinity) under classical source conditions and introduce a modified discrepancy principle, which asymptotically attains this rate.
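As a rough illustration of the averaging idea, here is a minimal sketch under simplifying assumptions: a known forward matrix A standing in for the Hilbert-space operator, i.i.d. Gaussian noise of per-component level sigma, plain Tikhonov regularization, and the classical discrepancy principle rather than the paper's modified one. All names and parameter values are illustrative.

```python
import numpy as np

def tikhonov(A, y, alpha):
    """Tikhonov-regularized solution x_alpha = (A^T A + alpha I)^{-1} A^T y."""
    return np.linalg.solve(A.T @ A + alpha * np.eye(A.shape[1]), A.T @ y)

def discrepancy_principle(A, y_bar, delta, tau=1.2, alpha0=1.0, q=0.5, max_iter=60):
    """Classical discrepancy principle: shrink alpha geometrically until the
    residual ||A x_alpha - y_bar|| falls below tau * delta."""
    alpha = alpha0
    for _ in range(max_iter):
        x = tikhonov(A, y_bar, alpha)
        if np.linalg.norm(A @ x - y_bar) <= tau * delta:
            break
        alpha *= q
    return x, alpha

# Hypothetical setup: m independent unbiased measurements of the exact data.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20)) / np.sqrt(50)
x_true = rng.standard_normal(20)
sigma, m = 0.1, 100
measurements = [A @ x_true + sigma * rng.standard_normal(50) for _ in range(m)]

y_bar = np.mean(measurements, axis=0)     # average as data approximation
delta = sigma * np.sqrt(50) / np.sqrt(m)  # noise-norm estimate, scales ~ 1/sqrt(m)
x_rec, alpha = discrepancy_principle(A, y_bar, delta)
print(f"alpha = {alpha:.2e}, error = {np.linalg.norm(x_rec - x_true):.3f}")
```

The point of the sketch is only the scaling: averaging m measurements reduces the effective noise level like 1/sqrt(m), and this estimated level is what the discrepancy principle is fed.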


Similar articles

Convergence rates for Morozov’s Discrepancy Principle using Variational Inequalities

We derive convergence rates for Tikhonov-type regularization with convex penalty terms, where the regularization parameter is chosen according to Morozov’s discrepancy principle and variational inequalities are used to generalize classical source and nonlinearity conditions. Rates are obtained first with respect to the Bregman distance and a Taylor-type distance and those results are combined t...
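For context, Morozov's discrepancy principle in this Tikhonov-type setting is usually stated as follows (a standard formulation with user-chosen constants $1 \le \tau_1 \le \tau_2$; background, not a claim about the article's exact assumptions): given noisy data $y^\delta$ with $\|y^\delta - y\| \le \delta$ and minimizers

$$x_\alpha^\delta \in \operatorname{argmin}_x \ \tfrac{1}{2}\|F(x) - y^\delta\|^2 + \alpha R(x),$$

choose $\alpha = \alpha(\delta)$ such that $\tau_1 \delta \le \|F(x_\alpha^\delta) - y^\delta\| \le \tau_2 \delta$.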


Further convergence results on the general iteratively regularized Gauss-Newton methods under the discrepancy principle

We consider the general iteratively regularized Gauss-Newton methods $x_{k+1} = x_0 - g_{\alpha_k}\big(F'(x_k)^* F'(x_k)\big) F'(x_k)^* \big( F(x_k) - y^\delta - F'(x_k)(x_k - x_0) \big)$ for solving nonlinear inverse problems $F(x) = y$ using only the available noisy data $y^\delta$ of $y$ satisfying $\|y^\delta - y\| \le \delta$ with a given small noise level $\delta > 0$. In order to produce a reasonable approximation to the sought solution, we terminate the iteration by the discre...
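A minimal runnable sketch of this iteration under simplifying assumptions: the concrete filter $g_\alpha(\lambda) = (\lambda + \alpha)^{-1}$ (the Tikhonov choice; the paper treats general $g_{\alpha_k}$), a geometric sequence $\alpha_k$, and the discrepancy principle $\|F(x_k) - y^\delta\| \le \tau\delta$ as the stopping rule. `F` and `Fprime` are user-supplied callables; all parameter values are illustrative.

```python
import numpy as np

def irgnm(F, Fprime, y_delta, x0, delta, tau=1.5, alpha0=1.0, q=0.5, max_iter=30):
    """Iteratively regularized Gauss-Newton method with the Tikhonov filter
    g_alpha(s) = 1/(s + alpha), terminated by the discrepancy principle."""
    x, alpha = x0.copy(), alpha0
    for _ in range(max_iter):
        residual = F(x) - y_delta
        if np.linalg.norm(residual) <= tau * delta:  # discrepancy principle
            break
        J = Fprime(x)  # Jacobian F'(x_k)
        # x_{k+1} = x_0 - g_alpha(J^T J) J^T (F(x_k) - y_delta - J (x_k - x_0))
        rhs = J.T @ (residual - J @ (x - x0))
        x = x0 - np.linalg.solve(J.T @ J + alpha * np.eye(len(x0)), rhs)
        alpha *= q  # alpha_k decreases geometrically
    return x
```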


Optimal rates of aggregation in classification under low noise assumption

Let $(\mathcal{X}, \mathcal{A})$ be a measurable space. We consider a random variable $(X, Y)$ on $\mathcal{X} \times \{-1, 1\}$ with probability distribution denoted by $\pi$. Denote by $P$ the marginal of $\pi$ on $\mathcal{X}$ and by $\eta(x) \stackrel{\mathrm{def}}{=} P(Y = 1 \mid X = x)$ the conditional probability function of $Y = 1$, knowing that $X = x$. We have $n$ i.i.d. observations of the couple $(X, Y)$, denoted by $D_n = ((X_i, Y_i))_{i=1,\dots,n}$. The aim is to predict the output label $Y$ for...
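The standard benchmark in this setting (textbook background, not specific to the article) is the Bayes classifier $f^*(x) = \operatorname{sign}(2\eta(x) - 1)$, which minimizes the misclassification risk $R(f) = \pi(Y \neq f(X))$; procedures are then compared through the excess risk $R(\hat f_n) - R(f^*)$.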


Optimal rates for stochastic convex optimization under Tsybakov noise condition

We focus on the problem of minimizing a convex function $f$ over a convex set $S$ given $T$ queries to a stochastic first-order oracle. We argue that the complexity of convex minimization is determined only by the rate of growth of the function around its minimizer $x_{f,S}$, as quantified by a Tsybakov-like noise condition. Specifically, we prove that if $f$ grows at least as fast as $\|x - x_{f,S}\|$ around its...
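The truncated growth condition is presumably of the standard Tsybakov type (the constants below are my notation, not quoted from the article): $f(x) - f(x_{f,S}) \ge \lambda \|x - x_{f,S}\|^\kappa$ for all $x \in S$, with some $\lambda > 0$ and exponent $\kappa \ge 1$; a larger $\kappa$ means flatter growth around the minimizer and, correspondingly, a harder optimization problem.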


A new discrepancy principle

The aim of this note is to prove a new discrepancy principle. The advantage of the new discrepancy principle over the known one consists in solving a minimization problem (see problem (2) below) approximately rather than exactly, and in the proof of a stability result. To explain this in more detail, let us recall the usual discrepancy principle, which can be stated as follows. Consid...
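For reference, the usual discrepancy principle alluded to here is commonly stated as follows (a standard formulation, not quoted from the note): for an equation $Au = f$ with noisy data $f_\delta$, $\|f_\delta - f\| \le \delta$, and regularized solutions $u_{\delta,\varepsilon}$, choose the regularization parameter $\varepsilon = \varepsilon(\delta)$ from the equation $\|A u_{\delta,\varepsilon} - f_\delta\| = C\delta$ with a constant $C > 1$.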



Journal

Journal title: Inverse Problems

Year: 2021

ISSN: 0266-5611, 1361-6420

DOI: https://doi.org/10.1088/1361-6420/ac1775